The new generation of deep photometric surveys requires unprecedentedly precise shape and photometry measurements of billions of galaxies to achieve their main science goals. At such depths, one major limiting factor is the blending of galaxies due to line-of-sight projection, with an expected fraction of blended galaxies of up to 50%. Current deblending approaches are in most cases either too slow or not accurate enough to reach the level of requirements. This work explores the use of deep neural networks to estimate the photometry of blended pairs of galaxies in monochrome space images, similar to the ones that will be delivered by the Euclid space telescope. Using a clean sample of isolated galaxies from the CANDELS survey, we artificially blend them and train two different network models to recover the photometry of the two galaxies. We show that our approach can recover the original, pre-blending photometry of the galaxies with $\sim$7% accuracy without any human intervention and without any assumption on the galaxy shape. This represents an improvement of at least a factor of 4 compared to the classical SExtractor approach. We also show that forcing the network to simultaneously estimate a binary segmentation map results in slightly improved photometry. All data products and codes will be made public to ease the comparison with other approaches on a common data set.

Photometry of high-redshift blended galaxies using deep learning / Boucaud, Alexandre; Huertas-Company, Marc; Heneka, Caroline; Ishida, Emille E. O.; Sedaghat, Nima; de Souza, Rafael S.; Moews, Ben; Dole, Hervé; Castellano, Marco; Merlin, Emiliano; Roscani, Valerio; Tramacere, Andrea; Killedar, Madhura; Trindade, Arlindo M. M. - In: MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY. - ISSN 0035-8711. - (2019).

Photometry of high-redshift blended galaxies using deep learning

Marco Castellano; Valerio Roscani; Andrea Tramacere
2019

astro-ph.GA; astro-ph.IM
01 Journal publication::01a Journal article
Files attached to this record
No files are associated with this record.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1344633
Warning: the displayed data have not been validated by the university.

Citations
  • PMC: N/A
  • Scopus: N/A
  • Web of Science: 34